41 research outputs found

    Prevalence, Treatment, and Outcomes of Coexistent Pulmonary Hypertension and Interstitial Lung Disease in Systemic Sclerosis

    Full text link
    Peer Reviewed
    https://deepblue.lib.umich.edu/bitstream/2027.42/150609/1/art40862.pdf
    https://deepblue.lib.umich.edu/bitstream/2027.42/150609/2/art40862_am.pd

    Minimal Clinically Important Differences for the Modified Rodnan Skin Score: Results from the Scleroderma Lung Studies (SLS-I and SLS-II)

    Get PDF
    Abstract Objective This study aimed to assess the minimal clinically important differences (MCIDs) for the modified Rodnan skin score (mRSS) using combined data from the Scleroderma Lung Studies (I and II). Methods MCID estimates for the mRSS at 12 months were calculated using three anchors: change in scores on the Health Assessment Questionnaire-Disability Index from baseline to 12 months, change in scores on the Patient Global Assessment from baseline to 12 months, and the answer at 12 months to the Short Form-36 health transition question “Compared to one year ago, how would you rate your health in general now?” We determined the mRSS MCID estimates for all participants and for those with diffuse cutaneous systemic sclerosis (dcSSc). We then assessed associations between MCID estimates of mRSS improvement and patient-reported outcomes, using Student’s t test to compare mean differences in patient outcomes between those who met the MCID improvement criteria and those who did not. Results The mean (SD) mRSS at baseline was 14.75 (10.72) for all participants and 20.93 (9.61) for those with dcSSc. The MCID estimate for mRSS improvement at 12 months ranged from 3 to 4 units for the overall group (improvement of 20–27% from baseline) and was 5 units for those with dcSSc (improvement of 24% from baseline). Those who met the mRSS MCID improvement criteria had statistically significant improvements in scores on the Short Form-36 Physical Component Summary, the Transition Dyspnea Index, and joint contractures at 12 months. Conclusion MCID estimates for the mRSS were 3–4 units for all participants and 5 units for those with dcSSc. These findings are consistent with previously reported MCID estimates for systemic sclerosis.
    https://deepblue.lib.umich.edu/bitstream/2027.42/147346/1/13075_2019_Article_1809.pd
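The anchor-based approach described above can be sketched in a few lines: the MCID is estimated as the mean 12-month mRSS change among participants whose anchor response indicates a minimal improvement. The data, anchor labels, and function name below are hypothetical illustrations, not the SLS-I/SLS-II dataset or code.

```python
# Hypothetical sketch of an anchor-based MCID estimate. Negative mRSS change
# means the skin score improved (decreased) over 12 months.

def anchor_based_mcid(mrss_change, anchor, minimal_level):
    """Mean mRSS change in the subgroup reporting minimal improvement."""
    subgroup = [d for d, a in zip(mrss_change, anchor) if a == minimal_level]
    return sum(subgroup) / len(subgroup)

# Toy data: 12-month mRSS changes paired with anchor responses
changes = [-5, -3, -4, 0, 2, -6, -1, -3]
anchors = ["somewhat better", "somewhat better", "somewhat better",
           "no change", "worse", "much better", "no change", "somewhat better"]

mcid = anchor_based_mcid(changes, anchors, "somewhat better")
print(round(mcid, 2))  # → -3.75 (i.e., an improvement of about 3-4 units)
```

On this toy sample the "somewhat better" responders improved by about 3.75 units on average, the same order as the 3-4 unit estimate the abstract reports.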

    The effect of the top 20 Alzheimer disease risk genes on gray-matter density and FDG PET brain metabolism

    Get PDF
    INTRODUCTION: We analyzed the effects of the top 20 Alzheimer disease (AD) risk genes on gray-matter density (GMD) and metabolism. METHODS: We ran stepwise linear regression analysis using posterior cingulate hypometabolism and medial temporal GMD as outcomes and all risk variants as predictors while controlling for age, gender, and APOE ε4 genotype. We explored the results in 3D using Statistical Parametric Mapping 8. RESULTS: Significant predictors of brain GMD were SLC24A4/RIN3 in the pooled and mild cognitive impairment (MCI) groups; ZCWPW1 in the MCI group; and ABCA7, EPHA1, and INPP5D in the AD group. Significant predictors of hypometabolism were EPHA1 in the pooled group, and SLC24A4/RIN3, NME8, and CD2AP in the normal control group. DISCUSSION: Multiple variants showed associations with GMD and brain metabolism. For most genes, the effects were limited to specific stages of the cognitive continuum, indicating that the genetic influences on brain metabolism and GMD in AD are complex and stage dependent.

    Associations of the Top 20 Alzheimer Disease Risk Variants With Brain Amyloidosis

    Get PDF
    Importance: Late-onset Alzheimer disease (AD) is highly heritable. Genome-wide association studies have identified more than 20 AD risk genes. The precise mechanism through which many of these genes are associated with AD remains unknown. Objective: To investigate the association of the top 20 AD risk variants with brain amyloidosis. Design, Setting, and Participants: This study analyzed the genetic and florbetapir F 18 data from 322 cognitively normal control individuals, 496 individuals with mild cognitive impairment, and 159 individuals with AD dementia who had genome-wide association study and 18F-florbetapir positron emission tomographic data from the Alzheimer's Disease Neuroimaging Initiative (ADNI), a prospective, observational, multisite tertiary center clinical and biomarker study. This ongoing study began in 2005. Main Outcomes and Measures: The study tested the association of AD risk allele carrier status (exposure) with florbetapir mean standard uptake value ratio (outcome) using stepwise multivariable linear regression while controlling for age, sex, and apolipoprotein E ε4 genotype. The study also reports on an exploratory 3-dimensional stepwise regression model using an unbiased voxelwise approach in Statistical Parametric Mapping 8 with cluster and significance thresholds at 50 voxels and uncorrected P < .01. Results: This study included 977 participants (mean [SD] age, 74 [7.5] years; 535 [54.8%] male and 442 [45.2%] female) from the ADNI-1, ADNI-2, and ADNI-Grand Opportunity. The adenosine triphosphate-binding cassette subfamily A member 7 (ABCA7) gene had the strongest association with amyloid deposition (χ2 = 8.38, false discovery rate-corrected P < .001), after apolipoprotein E ε4. Significant associations with ABCA7 were found in the asymptomatic and early symptomatic disease stages, suggesting an association with rapid amyloid accumulation.
The fermitin family homolog 2 (FERMT2) gene had a stage-dependent association with brain amyloidosis (FERMT2 × diagnosis χ2 = 3.53, false discovery rate-corrected P = .05), which was most pronounced in the mild cognitive impairment stage. Conclusions and Relevance: This study found an association of several AD risk variants with brain amyloidosis. The data also suggest that AD genes might differentially regulate AD pathologic findings across the disease stages.

    Mortality and Morbidity during Extreme Heat Events and Prevalence of Outdoor Work: An Analysis of Community-Level Data from Los Angeles County, California

    No full text
    Heat is a well-recognized hazard for workers in many outdoor settings, yet few investigations have compared the prevalence of outdoor work at the community level and rates of heat-related mortality and morbidity. This analysis examines whether heat-related health outcomes occur more frequently in communities with higher proportions of residents working in construction, agriculture, and other outdoor industries. Using 2005–2010 data from Los Angeles County, California, we analyze associations between community-level rates of deaths, emergency department (ED) visits, and hospitalizations during summer heat events and the prevalence of outdoor work. We find generally higher rates of heat-related ED visits and hospitalizations during summer heat events in communities with more residents working outdoors. Specifically, each percentage increase in residents working in construction resulted in an 8.1 percent increase in heat-related ED visits and a 7.9 percent increase in heat-related hospitalizations, while each percentage increase in residents working in agriculture and related sectors resulted in a 10.9 percent increase in heat-related ED visits. The findings suggest that outdoor work may significantly influence the overall burden of heat-related morbidity at the community level. Public health professionals and healthcare providers should recognize work and employment as significant heat risk factors when preparing for and responding to extreme heat events.
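Percent-increase figures like the 8.1% above are typically obtained by exponentiating a coefficient from a log-linear rate model (e.g. Poisson regression of event counts on the percent of outdoor workers). The abstract does not state the model used, so the sketch below is an assumption, with a hypothetical coefficient chosen only to reproduce the quoted figure.

```python
import math

# Hypothetical log-rate coefficient b for "percent of residents in
# construction"; under a log-linear model, a one-point increase multiplies
# the expected heat-related ED visit rate by exp(b).
b_construction = 0.0779

# Convert the coefficient to a percent change per one-point increase.
pct_increase = 100 * (math.exp(b_construction) - 1)
print(round(pct_increase, 1))  # → 8.1
```

The same conversion, 100·(exp(b) − 1), applies to the 7.9% and 10.9% figures for hospitalizations and agriculture.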

    Remote Monitoring of Patients With Hematologic Malignancies at High Risk of Febrile Neutropenia: Exploratory Study.

    No full text
    Background: Febrile neutropenia is one of the most common oncologic emergencies and is associated with significant, preventable morbidity and mortality. Most patients who experience a febrile neutropenia episode are hospitalized, resulting in significant economic cost. Objective: This exploratory study implemented a remote monitoring system comprising a digital infrared thermometer and a pulse oximeter with the capability to notify providers in real time of abnormalities in vital signs that could suggest early clinical deterioration and thereby improve clinical outcomes. Methods: The remote monitoring system was implemented and compared to standard-of-care vital signs monitoring in hospitalized patients with underlying hematologic malignancies complicated by a febrile neutropenia episode in order to assess the feasibility and validity of the system. Statistical analysis was performed using the intraclass correlation coefficient (ICC) to assess the consistency between the measurements taken using traditional methods and those taken with the remote monitoring system for each of the vital sign parameters (temperature, heart rate, and oxygen saturation). A linear mixed-effects model with a random subject effect was used to estimate the variance components. Bland-Altman plots were created for the parameters to further delineate the direction of any occurring bias. Results: A total of 23 patients were enrolled in the study (mean age 56 years, range 23-75; male patients: n=11, 47.8%). ICC analysis confirmed the high repeatability and accuracy of the heart rate assessment (ICC=0.856), acting as a supplement to remote temperature assessment.
While the sensitivity and specificity for capturing tachycardia above a rate of 100 bpm were excellent (88% and 97%, respectively), the sensitivity of the remote monitoring system in capturing temperatures >37.8 °C and oxygen saturation <92% was 45% and 50%, respectively. Conclusions: Overall, this novel approach using temperature, heart rate, and oxygen saturation assessments successfully provided real-time, clinically valuable feedback to providers. While the temperature and oxygen saturation assessments lagged in sensitivity compared to a standard in-hospital system, the heart rate assessment provided highly accurate complementary data. As a whole, the system provided additional information that can be applied to a clinically vulnerable population. By transitioning its application to high-risk patients in the outpatient setting, this system could help prevent additional use of health care services through early provider intervention and potentially improve outcomes.
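The ICC used above to quantify agreement between remote and standard readings can be illustrated with the one-way ANOVA form, ICC(1,1) = (MSB − MSW) / (MSB + (k − 1)·MSW), a simpler stand-in for the mixed-model variance-components estimate the study describes. The heart-rate pairs below are fabricated for illustration, not the study's data.

```python
# Minimal one-way ICC(1,1) sketch: each subject contributes k=2 paired
# measurements (standard monitor, remote device).

def icc_oneway(pairs):
    """ICC(1,1) from paired measurements via one-way ANOVA mean squares."""
    n, k = len(pairs), 2
    grand = sum(sum(p) for p in pairs) / (n * k)
    # Between-subjects mean square
    msb = k * sum((sum(p) / k - grand) ** 2 for p in pairs) / (n - 1)
    # Within-subject mean square
    msw = sum((x - sum(p) / k) ** 2 for p in pairs for x in p) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Toy heart-rate data: (standard monitor, remote device) per patient
hr = [(72, 74), (88, 86), (95, 97), (60, 61), (110, 108), (78, 80)]
print(round(icc_oneway(hr), 3))  # → 0.994
```

An ICC near 1 indicates that between-subject variation dominates the device disagreement, the pattern the study reports for heart rate (ICC=0.856).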

    Outcomes and Prognostic Factors of Pulmonary Hypertension Patients Undergoing Emergent Endotracheal Intubation.

    No full text
    Background: Emergent endotracheal intubations (ETI) in pulmonary hypertension (PH) patients are associated with increased mortality. Post-intubation interventions that could increase survivability in this population have not been explored. We evaluated early clinical characteristics and complications following emergent endotracheal intubation and sought predictors of adverse outcomes during this post-intubation period. Methods: Retrospective cohort analysis of adult patients with group 1 and group 3 PH who underwent emergent intubation between 2005 and 2021 in medical and liver transplant ICUs at a tertiary medical center. PH patients were compared to non-PH patients, matched by Charlson Comorbidity Index. Primary outcomes were 24-h post-intubation and inpatient mortality. Various 24-h post-intubation secondary outcomes were compared between the PH and control cohorts. Results: We identified 48 PH and 110 non-PH patients. Pulmonary hypertension was not associated with increased 24-h mortality (OR 1.32, 95%CI 0.35-4.94, P = .18), but was associated with inpatient mortality (OR 4.03, 95%CI 1.29-12.5, P = .016) after intubation. Within 24 h post-intubation, PH patients experienced more frequent acute kidney injury (43.5% vs. 19.8%, P = .006) and required higher norepinephrine dosing equivalents (6.90 [0.13-10.6] vs. 0.20 [0.10-2.03] mcg/kg/min, P = .037). Additionally, the median P/F ratio (PaO2/FiO2) was lower in PH patients (96.3 [58.9-201] vs. 233 [146-346] in non-PH, P = .001). Finally, a post-intubation increase in PaCO2 was associated with mortality in the PH cohort (post-intubation change in PaCO2 +5.14 ± 16.1 in non-survivors vs. -18.7 ± 28.0 in survivors, P = .007). Conclusions: Pulmonary hypertension was associated with worse outcomes after emergent endotracheal intubation than in similar patients without PH.
More importantly, our data suggest that the first 24 hours following intubation in the PH group represent a particularly vulnerable period that may determine long-term outcomes. Early post-intubation interventions may be key to improving survival in this population.

    Prevalent and Incident Vertebral Deformities in Midlife Women: Results from the Study of Women's Health Across the Nation (SWAN).

    No full text
    BACKGROUND: Vertebral fractures are the most common type of osteoporotic fracture among women, but estimates of their prevalence and incidence during middle age are limited. The development of vertebral morphometry (VM) using dual-energy X-ray absorptiometry (DXA) makes it more feasible to measure VM in large, longitudinal, observational studies. We conducted this study to: 1) contribute to the scant knowledge of the prevalence, incidence, and risk factors for vertebral deformities in middle-aged women; and 2) evaluate the performance of DXA-based VM measurement in a large, community-based sample. METHODS: The sample is derived from the Study of Women's Health Across the Nation (SWAN), a multi-site, community-based, longitudinal cohort study of the menopausal transition (MT). Using Hologic QDR 4500A instruments, we acquired initial VM measurements in 1446 women during calendar years 2004-2007; in 2012-2013, a follow-up VM was obtained in 1108. Annually, lumbar spine (LS) and femoral neck (FN) bone mineral density (BMD) were measured and participant characteristics were assessed with standardized instruments. Multivariable logistic regression models examined the relations between prevalent deformity and relevant characteristics. Analyses of characteristics associated with prevalent deformity were restricted to 824 women who had not taken bone-active medications since the SWAN baseline. We calculated incident deformity per person-year (PY) of observation, standardized to 1000 person-years. RESULTS: The cranial portion of the VM image yielded the lowest proportions of readable vertebrae: from T4 through T6, between 43% and 63% of vertebral bodies were evaluable. Greater BMI was associated with fewer readable levels (B = -0.088, p<0.0001). In the baseline sample of 1446 women, the prevalence of vertebral deformity was 3.2% (95% CI: 2.3, 4.1).
The relative odds of deformity increased by 61% per SD decrement in baseline LS BMD (p = 0.02) and were 67% greater per SD decrement in baseline FN BMD (p = 0.04). Odds of prevalent deformity increased by 21% per year increment in age (p = 0.02). On average, the 1108 women in the longitudinal sample were followed for 6.8 years (SD 0.5 years, range 5.1-8.3 years), and we observed an incidence of 1.98 vertebral deformities per 1000 PY. In the longitudinal sample, 628 participants had never used bone-active medications; their vertebral deformity incidence was 2.8 per 1000 PY. CONCLUSION: Prevalence of vertebral deformity in SWAN participants aged 50-60 years was low, and lower bone density at the LS and FN was strongly related to greater risk of prevalent deformity. Only about half of the vertebral levels between T4 and T6 could be adequately imaged by DXA. Greater BMI is associated with fewer readable vertebral levels.
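The incidence figures above are simple rates per person-year of follow-up. A minimal sketch of the calculation, with an event count back-calculated from the reported rate (the count of 15 is a hypothetical reconstruction, not taken from the study):

```python
# Incidence rate standardized to 1000 person-years (PY).

def incidence_per_1000_py(events, person_years):
    return 1000 * events / person_years

# 1108 women followed for ~6.8 years on average ≈ 7534 person-years;
# ~15 incident deformities reproduces roughly the reported 1.98 per 1000 PY.
py = 1108 * 6.8
print(round(incidence_per_1000_py(15, py), 2))  # → 1.99 (reported: 1.98)
```

The same formula with the medication-naive subset (628 women) and its event count yields the 2.8 per 1000 PY figure quoted above.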